We have made comparisons with assorted editing tasks, including face relighting (FR), …

Neural Information Processing Systems

Illumination maps are visualized: Fig. 1 shows the FR of male/female with three different references, and the produced …

The GNN model of Scarselli et al. (2009) was originally designed for classification or regression under … This paper further extends and explores the GNN in two aspects. Then, we use the GNN to unify many significant CV operations from different fields, like Farbman's …

The GNN can be controlled by our framework; we propose a new kernel structure, Eq. (12), with a guided feature to construct …

For FR in Fig. 4, if we perform QIA only for the luminance channel of the inputs, we obtain the left output; if we …
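
For context on the model the fragments cite: the original GNN of Scarselli et al. (2009) computes node states by iterating a contractive transition function over neighbor states until a fixed point, then applies a readout function per node. Below is a minimal sketch of that fixed-point scheme; the toy graph, dimensions, and linear transition are illustrative assumptions, not this paper's editing framework or its Eq. (12) kernel.

```python
import numpy as np

# Sketch of the classic GNN of Scarselli et al. (2009): node states
# are updated from neighbor states by a contractive transition until
# they reach a fixed point, then a readout maps states to outputs.
# Graph, sizes, and the linear transition are illustrative only.

rng = np.random.default_rng(0)

n_nodes, dim = 5, 4
adj = np.array([[0, 1, 1, 0, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [0, 0, 0, 1, 0]], dtype=float)
features = rng.normal(size=(n_nodes, dim))

# Small weights keep the update a contraction, which is what
# guarantees convergence of the fixed-point iteration.
W = 0.3 * rng.normal(size=(dim, dim)) / np.sqrt(dim)

x = np.zeros((n_nodes, dim))
for _ in range(200):
    x_new = np.tanh(adj @ x @ W + features)  # aggregate neighbor states
    if np.max(np.abs(x_new - x)) < 1e-9:     # converged to fixed point
        break
    x = x_new

# Readout head (classification or regression in the original paper).
outputs = x @ rng.normal(size=(dim, 1))
print(outputs.ravel())
```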


Bayesian estimation of discrete entropy with mixtures of stick-breaking priors

Archer, Evan, Park, Il Memming, Pillow, Jonathan W.

Neural Information Processing Systems

We consider the problem of estimating Shannon's entropy H in the under-sampled regime, where the number of possible symbols may be unknown or countably infinite. Pitman-Yor processes (a generalization of Dirichlet processes) provide tractable prior distributions over the space of countably infinite discrete distributions, and have found major applications in Bayesian non-parametric statistics and machine learning. Here we show that they also provide natural priors for Bayesian entropy estimation, due to the remarkable fact that the moments of the induced posterior distribution over H can be computed analytically. We derive formulas for the posterior mean (Bayes' least squares estimate) and variance under such priors. Moreover, we show that a fixed Dirichlet or Pitman-Yor process prior implies a narrow prior on H, meaning the prior strongly determines the entropy estimate in the under-sampled regime.
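
As the abstract notes, the Bayes' least squares estimate is the posterior mean: under squared-error loss the optimal estimator of the entropy

$$
H(\boldsymbol{\pi}) = -\sum_i \pi_i \log \pi_i
$$

is

$$
\hat{H}_{\text{Bayes}} = \mathbb{E}\left[\, H(\boldsymbol{\pi}) \mid \text{data} \,\right],
$$

which the paper shows can be computed analytically under Dirichlet and Pitman-Yor process priors.

The "narrow prior on H" claim can be checked numerically with a minimal Monte Carlo sketch (not the paper's estimator): draw many distributions from one fixed Pitman-Yor prior via truncated stick-breaking and look at the spread of their entropies. The truncation level and the parameter values PY(0.1, 10) below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_py_weights(d, alpha, n_sticks=2000):
    """Truncated stick-breaking draw from PY(d, alpha):
    V_k ~ Beta(1 - d, alpha + k*d),  pi_k = V_k * prod_{j<k} (1 - V_j).
    Truncating at n_sticks sticks is an approximation: the returned
    weights sum to slightly less than 1."""
    k = np.arange(1, n_sticks + 1)
    v = rng.beta(1.0 - d, alpha + k * d)
    # log of the stick length remaining before each break
    log_left = np.concatenate(([0.0], np.cumsum(np.log1p(-v))[:-1]))
    return v * np.exp(log_left)

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))  # in nats

# Entropies of 500 independent draws from a fixed PY(0.1, 10) prior.
d, alpha = 0.1, 10.0
hs = np.array([shannon_entropy(sample_py_weights(d, alpha))
               for _ in range(500)])
print(f"induced prior on H: mean {hs.mean():.2f} nats, sd {hs.std():.2f} nats")
```

The standard deviation comes out small relative to the mean: a fixed (d, alpha) pins down H tightly, which is the behavior the abstract describes and what motivates the paper's use of mixtures over stick-breaking priors. Setting d = 0 recovers a Dirichlet process prior.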